Conversation

@maxdebayser (Contributor) commented on Feb 26, 2025

FIX #13880
FIX #13881

In the code below, the cached `field_names` set ends up being shared between classes that are related by inheritance, in this case `ChatCompletionLogProbsContent(ChatCompletionLogProb)`. Since the cache is built only once, if the first model that is validated is an instance of the superclass, the cache remains wrong for the derived class from then on. Redefining `field_names` in the subclass prevents this. Another option would be to break up the inheritance and duplicate all the fields of `ChatCompletionLogProb` in `ChatCompletionLogProbsContent`.

import logging
from typing import ClassVar, Optional, Set

from pydantic import BaseModel, ConfigDict, model_validator

logger = logging.getLogger(__name__)

class OpenAIBaseModel(BaseModel):
    # OpenAI API does allow extra fields
    model_config = ConfigDict(extra="allow")

    # Cache class field names
    field_names: ClassVar[Optional[Set[str]]] = None

    @model_validator(mode="wrap")
    @classmethod
    def __log_extra_fields__(cls, data, handler):
        result = handler(data)
        if not isinstance(data, dict):
            return result
        field_names = cls.field_names
        if field_names is None:
            # Collect all class field names and their potential aliases
            field_names = set()
            for field_name, field in cls.model_fields.items():
                field_names.add(field_name)
                if alias := getattr(field, 'alias', None):
                    field_names.add(alias)
            cls.field_names = field_names

        # Compare the request keys against both field names and aliases
        if any(k not in field_names for k in data):
            logger.warning(
                "The following fields were present in the request "
                "but ignored: %s",
                data.keys() - field_names)
        return result
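To see the caching bug in isolation, here is a minimal standalone sketch (the `Base`/`Buggy`/`Fixed` classes are hypothetical, not vLLM code) of how a cache assigned through `cls.field_names` lands on whichever class validates first, and how redefining the ClassVar in a subclass avoids inheriting the stale set:

```python
# Hypothetical classes illustrating the ClassVar sharing problem: assigning
# cls.field_names stores the cache on the class that validates first, and
# subclasses that don't redefine the attribute see that stale cache through
# normal attribute lookup.
from typing import ClassVar, Optional, Set


class Base:
    field_names: ClassVar[Optional[Set[str]]] = None

    @classmethod
    def cache_fields(cls, names: Set[str]) -> Set[str]:
        if cls.field_names is None:
            cls.field_names = set(names)  # lands on cls, e.g. Base
        return cls.field_names


class Buggy(Base):
    pass  # inherits Base.field_names, so Base's cache shadows its own


class Fixed(Base):
    # Redefining the ClassVar gives this class its own cache slot.
    field_names: ClassVar[Optional[Set[str]]] = None


base_cached = Base.cache_fields({"token", "logprob"})
buggy_cached = Buggy.cache_fields({"token", "logprob", "top_logprobs"})
fixed_cached = Fixed.cache_fields({"token", "logprob", "top_logprobs"})
# buggy_cached is Base's stale two-element set; fixed_cached has all three.
```

This mirrors why `top_logprobs` triggered the spurious warning: the derived class was checking request keys against the superclass's smaller field set.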
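The unknown-field detection at the end of the validator can also be sketched with the stdlib alone; here `model_fields` is a stand-in (plain `SimpleNamespace` objects carrying an `alias` attribute, an assumption for illustration) for Pydantic's real `cls.model_fields` mapping:

```python
# Stdlib-only sketch of the alias-aware unknown-field check; model_fields
# here is a stand-in for Pydantic's cls.model_fields.
from types import SimpleNamespace

model_fields = {
    "token": SimpleNamespace(alias=None),
    "logprob": SimpleNamespace(alias="lp"),
}

# Collect field names plus any aliases, as the validator does.
field_names = set()
for field_name, field in model_fields.items():
    field_names.add(field_name)
    if alias := getattr(field, "alias", None):
        field_names.add(alias)

# A request that uses the alias "lp" and one unrecognized key.
data = {"token": "hello", "lp": -0.25, "top_logprobs": []}
ignored = data.keys() - field_names  # only the truly unknown keys remain
```

Because aliases are included in `field_names`, a request that sends `lp` instead of `logprob` is not flagged; only genuinely unknown keys end up in `ignored`.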

cc: @njhill

Signed-off-by: Max de Bayser <mbayser@br.ibm.com>
@github-actions bot commented

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs will not trigger a full CI run by default. Instead, only the fastcheck CI will run, which starts a small and essential subset of CI tests to quickly catch errors. You can run other CI tests on top of those by going to your fastcheck build on the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either: Add ready label to the PR or enable auto-merge.

🚀

mergify bot added the frontend label on Feb 26, 2025
@vrdn-23 (Contributor) commented on Mar 24, 2025

@maxdebayser Is this PR still being merged in? I'm seeing the same warning in the latest version as well
cc @DarkLight1337

@maxdebayser (Contributor, Author) commented

@vrdn-23, no it wasn't.

@DarkLight1337 (Member) left a comment


The tests pass now so it should be fine to merge, sorry for forgetting about this!

DarkLight1337 enabled auto-merge (squash) on March 25, 2025 at 04:31
github-actions bot added the ready label (ONLY add when PR is ready to merge/full CI is needed) on Mar 25, 2025
DarkLight1337 merged commit e977c11 into vllm-project:main on Mar 25, 2025
34 checks passed
wrmedford pushed a commit to wrmedford/vllm that referenced this pull request Mar 26, 2025
…roject#13925)

Signed-off-by: Max de Bayser <mbayser@br.ibm.com>
Signed-off-by: Wes Medford <wryanmedford@gmail.com>
lulmer pushed a commit to lulmer/vllm that referenced this pull request Apr 7, 2025
…roject#13925)

Signed-off-by: Max de Bayser <mbayser@br.ibm.com>
Signed-off-by: Louis Ulmer <ulmerlouis@gmail.com>
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Apr 29, 2025
shreyankg pushed a commit to shreyankg/vllm that referenced this pull request May 3, 2025
RichardoMrMu pushed a commit to RichardoMrMu/vllm that referenced this pull request May 12, 2025
…roject#13925)

Signed-off-by: Max de Bayser <mbayser@br.ibm.com>
Signed-off-by: Mu Huai <tianbowen.tbw@antgroup.com>

Labels

frontend, ready (ONLY add when PR is ready to merge/full CI is needed)

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[Bug]: vllm v0.7.3 - The following fields were present in the request but ignored: {'top_logprobs'}
[Bug]: top_logrpobs generating a WARNING

3 participants